debakarr
GitHub Repository: debakarr/machinelearning
Path: blob/master/Part 8 - Deep Learning/Convolutional Neural Networks/[Python] Convolutional Neural Networks.ipynb
Kernel: Python 3

Convolutional Neural Networks


Data preprocessing is done manually.

In our case, we:

  • created a dataset/ folder

  • created training_set/ and test_set/ subfolders inside dataset/

  • created cats/ and dogs/ subfolders inside both training_set/ and test_set/

  • placed 4000 cat pictures (index 1–4000) in dataset/training_set/cats

  • placed 4000 dog pictures (index 1–4000) in dataset/training_set/dogs

  • placed 1000 cat pictures (index 4001–5000) in dataset/test_set/cats

  • placed 1000 dog pictures (index 4001–5000) in dataset/test_set/dogs

So we now have 4000 training examples and 1000 test examples for each class.

Our directory structure is something like this:

dataset/
    training_set/
        dogs/
            dog.1.jpg
            dog.2.jpg
            ...
        cats/
            cat.1.jpg
            cat.2.jpg
            ...
    test_set/
        dogs/
            dog.4001.jpg
            dog.4002.jpg
            ...
        cats/
            cat.4001.jpg
            cat.4002.jpg
            ...
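The manual split above can also be scripted. Below is a minimal sketch, assuming the Kaggle-style file names (cat.1.jpg, dog.1.jpg, …) sit together in one source folder; the function name and both path arguments are hypothetical:

```python
import os
import shutil

def build_split(src_dir, dst_root, species, train_count=4000, test_count=1000):
    """Copy `species` images (e.g. cat.1.jpg ... cat.5000.jpg) from src_dir into
    dst_root/training_set/<species>s and dst_root/test_set/<species>s."""
    for subset, start, stop in [("training_set", 1, train_count),
                                ("test_set", train_count + 1, train_count + test_count)]:
        out = os.path.join(dst_root, subset, species + "s")
        os.makedirs(out, exist_ok=True)
        for i in range(start, stop + 1):
            name = f"{species}.{i}.jpg"
            shutil.copy(os.path.join(src_dir, name), os.path.join(out, name))

# build_split("all_images", "dataset", "cat")
# build_split("all_images", "dataset", "dog")
```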

If you want a bigger dataset, see the Kaggle competition Dogs vs. Cats ("Create an algorithm to distinguish dogs from cats").


Additional note: fitting a CNN to images takes a lot of time and processing power. It is better to use tensorflow-gpu, or alternatively to train your network on a cloud service. I personally used FloydHub. There are other alternatives too, such as AWS, Google Cloud, Paperspace, etc. I will link a few of them below.


Importing the Keras libraries and packages

from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense
Using TensorFlow backend.

Initialising the CNN

classifier = Sequential()

Step 1 - Convolution

classifier.add(Conv2D(32, (3, 3), input_shape=(64, 64, 3), activation="relu"))

Step 2 - Pooling

classifier.add(MaxPooling2D(pool_size = (2, 2)))

Adding a second convolutional layer followed by pooling

classifier.add(Conv2D(32, (3, 3), activation="relu"))
classifier.add(MaxPooling2D(pool_size = (2, 2)))
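With 64×64 inputs, the feature-map sizes through these layers can be traced with simple integer arithmetic (assuming the defaults used above: stride-1 "valid" convolutions and non-overlapping 2×2 pooling):

```python
def conv_out(size, k):    # "valid" convolution: output shrinks by k - 1
    return size - k + 1

def pool_out(size, p):    # non-overlapping pooling: floor division by pool size
    return size // p

s = 64                              # input images are 64x64 (input_shape=(64, 64, 3))
s = pool_out(conv_out(s, 3), 2)     # first Conv2D + MaxPooling2D: 64 -> 62 -> 31
s = pool_out(conv_out(s, 3), 2)     # second Conv2D + MaxPooling2D: 31 -> 29 -> 14
flat = s * s * 32                   # Flatten() will emit 14 * 14 * 32 = 6272 values
print(s, flat)
```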

Step 3 - Flattening

classifier.add(Flatten())

Step 4 - Full connection

classifier.add(Dense(activation="relu", units=128))
classifier.add(Dense(activation="sigmoid", units=1))
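The fully connected layers dominate the parameter count. As a sanity check, here is the count per layer computed by hand (weights plus biases; the 6272 comes from flattening the 14×14×32 feature maps):

```python
conv1 = (3 * 3 * 3) * 32 + 32    # 3x3 kernels over 3 input channels -> 896
conv2 = (3 * 3 * 32) * 32 + 32   # 3x3 kernels over 32 channels -> 9248
dense1 = 6272 * 128 + 128        # Flatten() emits 14*14*32 = 6272 values -> 802944
dense2 = 128 * 1 + 1             # single sigmoid output unit -> 129
total = conv1 + conv2 + dense1 + dense2
print(total)                     # total trainable parameters
```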

Compiling the CNN

classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

Part 2 - Fitting the CNN to the images

from keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)

test_datagen = ImageDataGenerator(rescale = 1./255)

training_set = train_datagen.flow_from_directory('/my_data/dataset/training_set',
                                                 target_size = (64, 64),
                                                 batch_size = 32,
                                                 class_mode = 'binary')

test_set = test_datagen.flow_from_directory('/my_data/dataset/test_set',
                                            target_size = (64, 64),
                                            batch_size = 32,
                                            class_mode = 'binary')

classifier.fit_generator(training_set,
                         steps_per_epoch = 250,     # 8000 training images / batch size 32
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)   # note: this counts batches, not images;
                                                    # ceil(2000/32) = 63 covers the test set once
Using TensorFlow backend.
Found 8000 images belonging to 2 classes.
Found 2000 images belonging to 2 classes.
Epoch 1/25
250/250 [==============================] - 231s - loss: 0.6863 - acc: 0.5567 - val_loss: 0.6840 - val_acc: 0.5528
Epoch 2/25
250/250 [==============================] - 187s - loss: 0.6480 - acc: 0.6283 - val_loss: 0.6220 - val_acc: 0.6745
Epoch 3/25
250/250 [==============================] - 187s - loss: 0.5852 - acc: 0.6909 - val_loss: 0.5571 - val_acc: 0.7184
Epoch 4/25
250/250 [==============================] - 186s - loss: 0.5490 - acc: 0.7192 - val_loss: 0.5198 - val_acc: 0.7564
Epoch 5/25
250/250 [==============================] - 186s - loss: 0.5160 - acc: 0.7424 - val_loss: 0.5173 - val_acc: 0.7384
Epoch 6/25
250/250 [==============================] - 187s - loss: 0.4983 - acc: 0.7530 - val_loss: 0.4937 - val_acc: 0.7512
Epoch 7/25
250/250 [==============================] - 186s - loss: 0.4815 - acc: 0.7645 - val_loss: 0.5162 - val_acc: 0.7542
Epoch 8/25
250/250 [==============================] - 187s - loss: 0.4633 - acc: 0.7740 - val_loss: 0.4748 - val_acc: 0.7775
Epoch 9/25
250/250 [==============================] - 187s - loss: 0.4426 - acc: 0.7883 - val_loss: 0.4785 - val_acc: 0.7754
Epoch 10/25
250/250 [==============================] - 186s - loss: 0.4287 - acc: 0.7985 - val_loss: 0.4653 - val_acc: 0.7910
Epoch 11/25
250/250 [==============================] - 187s - loss: 0.4228 - acc: 0.7988 - val_loss: 0.4506 - val_acc: 0.7892
Epoch 12/25
250/250 [==============================] - 186s - loss: 0.4095 - acc: 0.8139 - val_loss: 0.4646 - val_acc: 0.7879
Epoch 13/25
250/250 [==============================] - 187s - loss: 0.3999 - acc: 0.8183 - val_loss: 0.4876 - val_acc: 0.7860
Epoch 14/25
250/250 [==============================] - 187s - loss: 0.3766 - acc: 0.8283 - val_loss: 0.4700 - val_acc: 0.7891
Epoch 15/25
250/250 [==============================] - 186s - loss: 0.3657 - acc: 0.8390 - val_loss: 0.4514 - val_acc: 0.8093
Epoch 16/25
250/250 [==============================] - 187s - loss: 0.3545 - acc: 0.8425 - val_loss: 0.4668 - val_acc: 0.7999
Epoch 17/25
250/250 [==============================] - 187s - loss: 0.3497 - acc: 0.8442 - val_loss: 0.4651 - val_acc: 0.7965
Epoch 18/25
250/250 [==============================] - 187s - loss: 0.3263 - acc: 0.8598 - val_loss: 0.4673 - val_acc: 0.8044
Epoch 19/25
250/250 [==============================] - 187s - loss: 0.3163 - acc: 0.8635 - val_loss: 0.4659 - val_acc: 0.8103
Epoch 20/25
250/250 [==============================] - 187s - loss: 0.2997 - acc: 0.8707 - val_loss: 0.4745 - val_acc: 0.8096
Epoch 21/25
250/250 [==============================] - 186s - loss: 0.2863 - acc: 0.8751 - val_loss: 0.5066 - val_acc: 0.7871
Epoch 22/25
250/250 [==============================] - 187s - loss: 0.2821 - acc: 0.8746 - val_loss: 0.4680 - val_acc: 0.8089
Epoch 23/25
250/250 [==============================] - 186s - loss: 0.2604 - acc: 0.8895 - val_loss: 0.4885 - val_acc: 0.8130
Epoch 24/25
250/250 [==============================] - 187s - loss: 0.2497 - acc: 0.8962 - val_loss: 0.4733 - val_acc: 0.8087
Epoch 25/25
250/250 [==============================] - 186s - loss: 0.2414 - acc: 0.8988 - val_loss: 0.5130 - val_acc: 0.8076
<keras.callbacks.History at 0x7fae3a700400>
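Because class_mode = 'binary' pairs with the single sigmoid output unit, classifier.predict returns one probability per image, and flow_from_directory assigns class indices alphabetically (so cats → 0, dogs → 1, as training_set.class_indices would report). A small sketch of turning that probability into a label, with the class-index mapping hard-coded here as an assumption:

```python
def label_from_probability(p, class_indices=None, threshold=0.5):
    """Map the sigmoid output of the final Dense layer to a class name.
    `class_indices` mirrors what training_set.class_indices would report."""
    if class_indices is None:
        class_indices = {'cats': 0, 'dogs': 1}   # alphabetical, per flow_from_directory
    index_to_class = {v: k for k, v in class_indices.items()}
    return index_to_class[int(p >= threshold)]

print(label_from_probability(0.92))  # 'dogs'
print(label_from_probability(0.08))  # 'cats'
```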